Varying irrelevant phonetic features hinders learning of the feature being trained


Related articles

Learning with Many Irrelevant Features

In many domains, an appropriate inductive bias is the MIN-FEATURES bias, which prefers consistent hypotheses definable over as few features as possible. This paper defines and studies this bias. First, it is shown that any learning algorithm implementing the MIN-FEATURES bias requires Θ((1/ε)[ln(1/δ) + 2^p + p ln n]) training examples to guarantee PAC-learning a concept having p relevant features out of...


Learning phonetic features from waveforms

Unsupervised learning of broad phonetic classes by infants was simulated using a statistical mixture model. With the phonetic labels removed, hand-transcribed segments from the TIMIT database were used in model-based clustering to obtain data-driven classes. Simple Hidden Markov Models were chosen to be the components of the mixture, with Mel-Cepstral coefficients as the front-end. The sound cl...
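The clustering idea in this abstract can be illustrated in miniature. The sketch below is a deliberately simplified stand-in: it replaces the paper's mixture of HMMs over Mel-cepstral segments with plain k-means on invented 2-D vectors, just to show how unlabeled acoustic points can fall into data-driven classes; all data, dimensions, and parameters here are assumptions, not the paper's setup.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Toy k-means: a much-simplified stand-in for model-based
    clustering of acoustic segments into broad phonetic classes."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        # Recompute each center as the mean of its cluster.
        for j, cl in enumerate(clusters):
            if cl:
                centers[j] = tuple(sum(x) / len(cl) for x in zip(*cl))
    return centers, clusters

# Two well-separated toy "phonetic classes" in a 2-D cepstral-like space.
rng = random.Random(1)
data = ([(rng.gauss(0, 0.3), rng.gauss(0, 0.3)) for _ in range(50)] +
        [(rng.gauss(3, 0.3), rng.gauss(3, 0.3)) for _ in range(50)])
centers, clusters = kmeans(data, 2)
```

With two tight, well-separated blobs, the recovered centers land near (0, 0) and (3, 3) even though no labels were given, which is the essence of the data-driven classes described above.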


The impact of skopos on syntactic features of the target text

The present study is an experimental case study which investigates the impacts, if any, of skopos on syntactic features of the target text. Two test groups, each consisting of 10 MA students, translated a set of sentences selected from advertising texts in the operative and informative mode. The resulting target texts were then statistically analyzed in terms of the number of words, phrases, si...


Learning Boolean Concepts in the Presence of Many Irrelevant Features

In many domains, an appropriate inductive bias is the MIN-FEATURES bias, which prefers consistent hypotheses definable over as few features as possible. This paper defines and studies this bias in Boolean domains. First, it is shown that any learning algorithm implementing the MIN-FEATURES bias requires Θ((1/ε)[ln(1/δ) + 2^p + p ln n]) training examples to guarantee PAC-learning a concept having p relev...
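Assuming the standard statement of this bound, with ε the accuracy parameter, δ the confidence parameter, p the number of relevant features, and n the total number of features, the MIN-FEATURES sample-complexity expression can be written as:

```latex
% MIN-FEATURES sample complexity (as stated in the abstract above);
% epsilon = accuracy, delta = confidence,
% p = relevant features, n = total features.
\Theta\!\left(\frac{1}{\varepsilon}
  \left[\ln\frac{1}{\delta} + 2^{p} + p \ln n\right]\right)
```

The 2^p term reflects the cost of distinguishing among all Boolean concepts over the p relevant features, while p ln n is the cost of identifying which p of the n features are relevant.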


On Feature Selection: Learning with Exponentially Many Irrelevant Features as Training Examples

We consider feature selection in the "wrapper" model of feature selection. This typically involves an NP-hard optimization problem that is approximated by heuristic search for a "good" feature subset. First considering the idealization where this optimization is performed exactly, we give a rigorous bound for generalization error under feature selection. The search heuristics typically used ar...
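The heuristic search mentioned above is commonly greedy forward selection. Here is a hedged sketch under invented assumptions: the scorer below is a hypothetical stand-in for the cross-validated accuracy a real wrapper would compute, and the feature names and penalty are made up for illustration.

```python
def forward_select(features, score):
    """Greedy forward search: approximate the NP-hard wrapper
    optimization by adding one feature at a time while the
    score keeps improving."""
    selected, best = [], score([])
    improved = True
    while improved:
        improved = False
        for f in features:
            if f in selected:
                continue
            s = score(selected + [f])
            if s > best:
                best, candidate = s, f
                improved = True
        if improved:
            selected.append(candidate)
    return selected, best

# Hypothetical scorer: rewards the two genuinely relevant features
# and slightly penalizes every feature kept (irrelevant ones included).
RELEVANT = {"f1", "f2"}
def score(subset):
    return sum(1.0 for f in subset if f in RELEVANT) - 0.1 * len(subset)

chosen, val = forward_select(["f1", "f2", "f3", "f4"], score)
```

Under this toy scorer the search keeps exactly the relevant features and rejects the irrelevant ones, since adding an irrelevant feature only pays the penalty.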



Journal

Journal title: The Journal of the Acoustical Society of America

Year: 2016

ISSN: 0001-4966

DOI: 10.1121/1.4939736